A Web Application Vulnerability Detection Method Based on Web Crawler Technology
Authors
Abstract
Similar Resources
A Web Crawler System Design Based on Distributed Technology
A practical distributed web crawler architecture is designed. A distributed cooperative crawling algorithm is put forward to coordinate page fetching across distributed crawler nodes. A log structure and a hash structure are combined into a large-scale web page store that can serve both a large volume of random accesses and the continuous insertion of newly added pages. Experime...
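As a rough illustration of the log-plus-hash storage idea summarized above, the sketch below (a hypothetical PageStore class, not taken from the paper) appends new pages to a log while a hash index maps URLs to log offsets for fast random access.

```python
import hashlib

class PageStore:
    """Minimal sketch of a combined log + hash page store (illustrative only)."""

    def __init__(self):
        self.log = []    # append-only log of (url, content) records
        self.index = {}  # hash index: URL digest -> offset in the log

    def _key(self, url):
        return hashlib.md5(url.encode("utf-8")).hexdigest()

    def put(self, url, content):
        # Newly added pages are simply appended, so writes stay sequential.
        offset = len(self.log)
        self.log.append((url, content))
        self.index[self._key(url)] = offset

    def get(self, url):
        # Random access goes through the hash index in O(1).
        offset = self.index.get(self._key(url))
        return self.log[offset][1] if offset is not None else None
```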
Reinforcement-Based Web Crawler
This paper presents a focused web crawler system that automatically builds minority-language corpora. The system uses a database of relevant and irrelevant documents to test the relevance of retrieved web documents, and it requires a starting web document to indicate where the search should begin.
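The relevance filtering step could look roughly like the sketch below, which assumes a simple word-overlap score against the stored relevant and irrelevant documents; the paper's actual reinforcement-based mechanism is not described in this excerpt.

```python
def relevance_score(text, relevant_docs, irrelevant_docs):
    """Score a fetched page by word overlap with labeled documents (illustrative)."""
    words = set(text.lower().split())

    def avg_overlap(docs):
        if not docs:
            return 0.0
        return sum(len(words & set(d.lower().split())) for d in docs) / len(docs)

    # Positive score means the page looks more like the relevant examples.
    return avg_overlap(relevant_docs) - avg_overlap(irrelevant_docs)

def is_relevant(text, relevant_docs, irrelevant_docs, threshold=1.0):
    # threshold is an assumed tuning parameter, not taken from the paper.
    return relevance_score(text, relevant_docs, irrelevant_docs) >= threshold
```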
Priority based Semantic Web Crawler
The Internet contains billions of web pages, and these pages are linked to each other using URLs (Uniform Resource Locators). The web crawler is a main module of a search engine that gathers these documents from the WWW. Most web pages on the Internet are active and change periodically, so the crawler must revisit them to keep the search engine's database up to date. In this paper, pri...
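One way to picture the priority-driven re-crawling this abstract mentions is the sketch below, which assumes priority is derived from an observed page change rate; the class, method, and parameter names are illustrative and not taken from the paper.

```python
import heapq
import time

class PriorityFrontier:
    """Re-crawl scheduler: pages that change often are revisited sooner (illustrative)."""

    def __init__(self):
        self._heap = []  # entries: (next_visit_time, url)

    def schedule(self, url, change_rate):
        # Higher observed change rate -> shorter revisit interval (assumed policy).
        interval = 3600.0 / max(change_rate, 0.01)
        heapq.heappush(self._heap, (time.time() + interval, url))

    def next_url(self):
        # Return the most overdue URL, or None if nothing is due yet.
        if self._heap and self._heap[0][0] <= time.time():
            return heapq.heappop(self._heap)[1]
        return None
```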
Web Crawler: A Review
Information retrieval deals with searching for and retrieving information within documents, and it also covers searching online databases and the Internet. A web crawler is defined as a program or software that traverses the Web and downloads web documents in a methodical, automated manner. Based on the type of knowledge used, web crawlers are usually divided into three types of crawling techniques: General Purpo...
Review Paper on Web Crawler
A web crawler is software or a computer program used to browse the World Wide Web in an ordered manner. The methodology behind this procedure is known as web crawling or spidering. Search engines that rely on spidering can return current information. Web crawlers create copies of all visited web pages, which the search engine then uses as a refe...
Journal
Journal title: Computer Science and Application
Year: 2016
ISSN: 2161-8801, 2161-881X
DOI: 10.12677/csa.2016.66042